Multi-Step Gradient Methods for Networked Optimization
Authors
Abstract
Similar resources
Adaptive Step-Size for Policy Gradient Methods
In the last decade, policy gradient methods have grown significantly in popularity in the reinforcement-learning field. In particular, they have been widely employed in motor control and robotic applications, thanks to their ability to cope with continuous state and action domains and partially observable problems. Policy gradient research has mainly focused on the identification of effe...
Spectral gradient methods for linearly constrained optimization
Linearly constrained optimization problems with simple bounds are considered in the present work. First, a preconditioned spectral gradient method is defined for the case in which no simple bounds are present. This algorithm can be viewed as a quasi-Newton method in which the approximate Hessians satisfy a weak secant equation. The spectral choice of steplength is embedded into the Hessian appro...
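A minimal sketch of a projected spectral-gradient step for simple box bounds may help illustrate the idea; the function name, the projection-based bound handling, and the fallback step-size are illustrative assumptions, not the algorithm of the cited paper.

import numpy as np

def projected_spectral_gradient(grad, x0, lower, upper, iters=100, alpha0=1.0):
    # Generic projected spectral-gradient iteration for min f(x) s.t. lower <= x <= upper.
    # The step-length is the spectral (Barzilai-Borwein) choice s's / s'y; its inverse times
    # the identity satisfies a weak secant condition along the most recent step.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = np.clip(x - alpha * g, lower, upper)          # gradient step, then projection onto the box
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        alpha = float(s @ s) / sy if sy > 1e-12 else alpha0   # spectral step; fall back if curvature is lost
        x, g = x_new, g_new
    return x

Practical codes of this type also add a (typically nonmonotone) line search for globalization; the sketch omits it for brevity.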
Universal gradient methods for convex optimization problems
In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results...
Step-size Estimation for Unconstrained Optimization Methods
Some computable schemes for descent methods without line search are proposed. Convergence properties are presented. Numerical experiments concerning large scale unconstrained minimization problems are reported. Mathematical subject classification: 90C30, 65K05, 49M37.
Gradient Methods with Adaptive Step-Sizes
Motivated by the superlinear behavior of the Barzilai-Borwein (BB) method for two-dimensional quadratics, we propose two gradient methods which adaptively choose a small step-size or a large step-size at each iteration. The small step-size is primarily used to induce a favorable descent direction for the next iteration, while the large step-size is primarily used to produce a sufficient reducti...
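As a rough illustration of the small-step/large-step idea, here is a sketch of a gradient method for a strictly convex quadratic that switches between the two Barzilai-Borwein step-sizes; the switching threshold kappa, the starting step, and the function name are assumptions for illustration, not the precise rule proposed in the cited paper.

import numpy as np

def adaptive_bb_quadratic(A, b, x0, iters=200, kappa=0.5, tol=1e-10):
    # Gradient iteration x <- x - alpha * g for f(x) = 0.5 x'Ax - b'x.
    # bb1 = s's / s'y is the large step; bb2 = s'y / y'y is the small step (bb2 <= bb1).
    # When bb2 is much smaller than bb1 (ratio below kappa), the small step is used to
    # steer toward a better descent direction; otherwise the large step is taken to
    # force a bigger reduction (an ABB-style switching rule).
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = 1.0 / max(np.linalg.norm(g), 1e-12)   # simple choice for the very first step
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy, yy, ss = float(s @ y), float(y @ y), float(s @ s)
        if sy <= 0.0 or yy == 0.0:
            break                                 # curvature information unusable; stop
        bb1, bb2 = ss / sy, sy / yy
        alpha = bb2 if bb2 / bb1 < kappa else bb1
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

For instance, adaptive_bb_quadratic(np.diag([1.0, 100.0]), np.zeros(2), np.ones(2)) typically reaches the minimizer in far fewer iterations than a constant-step gradient method on the same ill-conditioned quadratic.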
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2013
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/tsp.2013.2278149